Neuronal activity arises from an interaction between ongoing firing generated spontaneously by neural circuits and responses driven by external stimuli. Using mean-field analysis, we ask how a neural network that intrinsically generates chaotic patterns of activity can remain sensitive to extrinsic input. We find that inputs not only drive network responses, they also actively suppress ongoing activity, ultimately leading to a phase transition in which chaos is completely eliminated. The critical input intensity at the phase transition is a non-monotonic function of stimulus frequency, revealing a "resonant" frequency at which the input is most effective at suppressing chaos even though the power spectrum of the spontaneous activity peaks at zero and falls exponentially. A prediction of our analysis is that the variance of neural responses should be most strongly suppressed at frequencies matching the range over which many sensory systems operate.